In probability theory, the law (or formula) of total probability is a fundamental rule relating marginal probabilities to conditional probabilities.
The law of total probability is[1] the proposition that if \(\{B_n : n = 1, 2, 3, \ldots\}\) is a finite or countably infinite partition of a sample space (in other words, a set of pairwise disjoint events whose union is the entire sample space) and each event \(B_n\) is measurable, then for any event \(A\) of the same probability space:

\[
\Pr(A) = \sum_n \Pr(A \cap B_n)
\]

or, alternatively,[1]

\[
\Pr(A) = \sum_n \Pr(A \mid B_n)\,\Pr(B_n),
\]

where, for any \(n\) for which \(\Pr(B_n) = 0\), these terms are simply omitted from the summation, because \(\Pr(A \mid B_n)\) is finite.
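The identity follows from countable additivity (a standard argument, sketched here for completeness rather than quoted from the cited sources): since the \(B_n\) are pairwise disjoint and cover the sample space, the events \(A \cap B_n\) are pairwise disjoint and their union is \(A\), so

\[
\Pr(A) = \Pr\!\Big(\bigcup_n (A \cap B_n)\Big) = \sum_n \Pr(A \cap B_n) = \sum_n \Pr(A \mid B_n)\,\Pr(B_n).
\]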
The summation can be interpreted as a weighted average, and consequently the marginal probability, \(\Pr(A)\), is sometimes called "average probability";[2] "overall probability" is sometimes used in less formal writings.[3]
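As a numerical illustration of the weighted-average reading (the figures below are invented for the example and do not come from the cited sources), suppose two machines produce all items in a factory and defect rates differ by machine:

# Law of total probability as a weighted average (illustrative numbers only).
p_machine = {"M1": 0.6, "M2": 0.4}          # Pr(B_n): which machine made the item
p_defect_given = {"M1": 0.01, "M2": 0.05}   # Pr(A | B_n): defect rate per machine

# Pr(A) = sum_n Pr(A | B_n) * Pr(B_n): a weighted average of the conditional rates.
p_defect = sum(p_defect_given[m] * p_machine[m] for m in p_machine)
print(p_defect)  # 0.6*0.01 + 0.4*0.05 = 0.026

The overall defect rate, 0.026, lies between the two conditional rates and is pulled toward the rate of the machine with the larger weight.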
The law of total probability can also be stated for conditional probabilities. Taking the \(B_n\) as above, and assuming \(C\) is an event that is not mutually exclusive with \(A\) or any of the \(B_n\):

\[
\Pr(A \mid C) = \sum_n \Pr(A \mid C \cap B_n)\,\Pr(B_n \mid C).
\]
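This version reduces to the definitions in a few steps (again a routine manipulation, included for completeness): expanding \(\Pr(A \mid C)\) over the partition and multiplying and dividing by \(\Pr(B_n \cap C)\) gives

\[
\Pr(A \mid C) = \frac{\Pr(A \cap C)}{\Pr(C)}
= \sum_n \frac{\Pr(A \cap B_n \cap C)}{\Pr(B_n \cap C)} \cdot \frac{\Pr(B_n \cap C)}{\Pr(C)}
= \sum_n \Pr(A \mid C \cap B_n)\,\Pr(B_n \mid C),
\]

with any term for which \(\Pr(B_n \cap C) = 0\) omitted.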
One common application of the law is where the events \(B_n\) coincide with a discrete random variable \(X\) taking each value in its range, i.e. \(B_n\) is the event \(X = x_n\). It follows that the probability of an event \(A\) is equal to the expected value of the conditional probabilities of \(A\) given \(X = x_n\). That is,

\[
\Pr(A) = \sum_n \Pr(A \mid X = x_n)\,\Pr(X = x_n) = \operatorname{E}_X[\Pr(A \mid X)],
\]

where \(\Pr(A \mid X)\) is the conditional probability of \(A\) given \(X\),[3] and where \(\operatorname{E}_X\) denotes the expectation with respect to the random variable \(X\).
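The identity is easy to check numerically. The sketch below uses invented values (three biased coins, one of which is drawn at random; \(A\) is the event that the drawn coin shows heads) and compares the exact sum with a Monte Carlo estimate:

# Pr(A) = E_X[Pr(A | X)] for a discrete X, checked by simulation (illustrative values only).
import random

p_x = {1: 0.2, 2: 0.5, 3: 0.3}              # Pr(X = x_n): which coin is drawn
p_heads_given_x = {1: 0.9, 2: 0.5, 3: 0.1}  # Pr(A | X = x_n): chance of heads per coin

exact = sum(p_heads_given_x[x] * p_x[x] for x in p_x)  # law of total probability

random.seed(0)
trials = 100_000
hits = 0
for _ in range(trials):
    x = random.choices(list(p_x), weights=list(p_x.values()))[0]  # draw X
    if random.random() < p_heads_given_x[x]:                      # then draw A given X
        hits += 1

print(exact, hits / trials)  # the empirical frequency should be close to 0.46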
This result can be generalized to continuous random variables (via continuous conditional density), and the expression becomes

\[
\Pr(A) = \operatorname{E}[\Pr(A \mid \sigma(X))],
\]

where \(\sigma(X)\) denotes the sigma-algebra generated by the random variable \(X\).
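When \(X\) has a probability density \(f_X\), the outer expectation can be written as an integral, \(\Pr(A) = \int \Pr(A \mid X = x)\, f_X(x)\, dx\). The sketch below approximates that integral with a midpoint Riemann sum, using an invented conditional probability \(\Pr(A \mid X = x) = x^2\) and \(X\) uniform on \([0, 1]\) (so \(f_X(x) = 1\) there); neither choice comes from the cited references:

# Continuous version: Pr(A) = integral of Pr(A | X = x) * f_X(x) dx  (illustrative sketch).
N = 100_000
dx = 1.0 / N
# Pr(A | X = x) = x**2 and f_X(x) = 1 on [0, 1]; midpoint rule over N subintervals.
p_A = sum(((i + 0.5) * dx) ** 2 * 1.0 * dx for i in range(N))
print(p_A)  # close to the exact value 1/3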
The term law of total probability is sometimes taken to mean the law of alternatives, which is a special case of the law of total probability applying to discrete random variables. One author even uses the terminology "continuous law of alternatives" in the continuous case.[4] This result is given by Grimmett and Welsh[5] as the partition theorem, a name that they also give to the related law of total expectation.